Gary Smith
Pomona College
Journal of Statistics Education v.6, n.3 (1998)
Copyright (c) 1998 by Gary Smith, all rights reserved. This text may be freely shared among individuals, but it may not be republished in any medium without express written consent from the author and advance notification of the editor.
Key Words: Activity-based learning; Introductory statistics; Learning by doing; Team projects.
To help students develop statistical reasoning, a traditional introductory statistics course was modified to incorporate a semester-long sequence of projects, with written and oral reports of the results. Student test scores improved dramatically, and students were overwhelmingly positive in their assessment of this new approach.
1 A radical reform of introductory statistics classes has been advocated by many, often motivated by observations similar to Hogg's (1991): "students frequently view statistics as the worst course taken in college." Some statisticians believe that the goals of an introductory statistics course should be redirected from mathematical technique to data analysis. Others advocate changes in pedagogy, replacing passively received lectures with hands-on activities.
2 For most of my 25 years teaching statistics, I have felt that my role is to give clear lectures that transmit information from the expert (me) to the novices (students). This narcissistic perspective is flattering, but, I believe, misplaced. The focus should not be on the professor, but rather on the students. The proper question is not "How can I make my lectures more brilliant?" but rather "How can I help my students learn statistics?"
3 Several authors have recommended laboratory-based courses, in-class activities, or course-long projects. To allow students to learn statistics by doing statistics, I now supplement my lectures with a sequence of biweekly out-of-class group projects with written and oral reports of the results. The student response has been overwhelmingly positive, converting me from a skeptic to a believer.
4 Many statisticians, including Bradstreet (1996) and Cobb (1991), argue that statistical reasoning should take precedence over statistical methods. Hogg (1991) wrote that, "At the beginning level, statistics should not be presented as a branch of mathematics. Good statistics is not equated with mathematical purity or rigor but is more closely associated with careful thinking."
5 Statistical reasoning is not an irrelevant abstraction. To demonstrate the power, elegance, and even beauty of statistical reasoning, realistic examples from a wide variety of disciplines can persuade students that they are learning critical-thinking skills that can be applied every day and in almost any career. It is important that the motivating examples be real. Students are more easily convinced of the power of statistical reasoning if they see it applied to questions that are interesting and real to them. Tools that are used to answer artificial questions will seem artificial too. In addition, students will remember a real-world question and how we answered it more easily than they will remember a contrived example.
6 The problem with relying on examples done by others is that students remain passive participants and do not experience firsthand the many issues that arise in data collection and analysis. As Hogg (1991) wrote: "Instead of asking students to work on 'old' data, even though real, is it not better to have them find or generate their own data? Projects give students experience in asking questions, defining problems, formulating hypotheses and operational definitions, designing experiments and surveys, collecting data and dealing with measurement error, summarizing data, analyzing data, communicating findings, and planning 'follow-up' experiments suggested by the findings." Similarly, Snee (1993) wrote that the "collection and analysis of data is at the heart of statistical thinking. Data collection promotes learning by experience and connects the learning process to reality."
7 One way to help students develop their statistical reasoning is to incorporate active-learning strategies that allow students to supplement what they have heard and read about statistics by actually doing statistics -- designing studies, collecting data, analyzing their results, preparing written reports, and giving oral presentations.
8 In arguing for experiential learning, Snee (1993) quotes the Chinese proverb, "I hear, I forget. I see, I remember. I do, I understand." Bradstreet (1996) writes that, "Learning is situated in activity. Students who use the tools of their education actively rather than just acquire them build an increasingly rich implicit understanding of the world in which they use the tools and of the tools themselves."
9 Many have proposed ways of actively engaging students in hands-on data collection. Bradstreet (1996) recommends a laboratory-based course. Others endorse in-class activities (Dietz 1993; Gnanadesikan, Scheaffer, Watkins, and Witmer 1997); a single three-week project (Hunter 1977); or a course-long project (Chance 1997; Fillebrown 1994; Ledolter 1995; Mackisack 1994). Nonetheless, Cobb (1993) summarized a dozen NSF grants intended to improve the teaching of statistics, and found that none involved student projects collecting and analyzing data.
10 If it is true that students learn by doing, then a series of out-of-class projects involving a variety of data and statistical tools may be more beneficial than in-class activities that must use very restricted kinds of data and do not allow sustained planning and analysis, or one long project that uses only one or two statistical tools. I consequently now use a semester-long sequence of projects to involve the students in hands-on data collection and analysis. This strategy allows more depth than in-class activities and more variety than course-long projects. To enhance their experiential learning, students prepare written and oral reports of their results.
11 Garfield (1993) discusses the details for implementing cooperative-learning strategies, and researchers have reported success from cooperative practices in introductory statistics classes (Dietz 1993; Jones 1991; Keeler and Steinhorst 1995; Shaughnessy 1977), though most of their cooperative activities are limited to homework and studying. In order to encourage teamwork (and reduce the grading burden), I divide the students in my class of thirty into three-person teams. With six biweekly projects, each member of a three-person team can do two written reports and two oral presentations of the project results. Large statistics classes can be divided into somewhat larger teams, with fewer reports per person. The use of teams fosters cooperative learning, develops team-working skills, and often builds considerable camaraderie.
12 I allow students to form their own three-person teams or partial two-person teams, and I complete the team assignments by literally drawing names out of a hat -- a physical demonstration of random selection! In practice, most students do not know each other well enough to form their own teams, but those who are friends seem to appreciate the opportunity to be teammates. Unless there are irreconcilable differences, the teams stay together all semester.
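A minimal sketch of the drawing-names-from-a-hat step, done electronically in Python, appears below. The roster is hypothetical, and the sketch simply shuffles whatever names remain unassigned and groups them into threes, so it illustrates random team formation rather than recording the actual classroom procedure.

import random

# Hypothetical roster of students who have not formed their own teams.
students = ["Alice", "Ben", "Carla", "Dmitri", "Elena", "Farid",
            "Grace", "Hiro", "Ines", "Jamal", "Kate", "Liam"]

random.shuffle(students)  # the electronic equivalent of drawing names from a hat

# Group the shuffled roster into teams of three.
teams = [students[i:i + 3] for i in range(0, len(students), 3)]

for number, members in enumerate(teams, start=1):
    print(f"Team {number}: {', '.join(members)}")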
13 To discourage free riding, I announce at the beginning of the semester that I will ask each student at the end of the semester to evaluate each teammate's contribution to the team projects, and that I may adjust a student's project grades based on these reports. This evaluation form states: "Ideally, each team member will do an equal amount of work on team projects, helping others as much as the others help him or her. Please circle the comment that seems most apt for this person: does more than the other team members put together; helps us more than we help him or her; does a third of the team work, a fair share; contributes, but not as much as the other team members; contributes little or nothing to the team."
14 At the beginning of the semester, each team is given six biweekly projects to do over the course of the semester, each with a specific due date. The projects are staggered so that, in any given week, half of the teams are completing projects. The sequence of projects matches the coverage of the course material: a topic covered in class one week shows up in the projects due the following week.
15 Some projects involve a search of the library or the World Wide Web for data, some a polling of students or professors, and some data collected in other ways (for example, free throws by basketball players). The Appendix describes 20 of the 60 projects that I assigned in the spring of 1998 and (in brackets) the anticipated type of data, analysis, and results. One team's assignment in the spring of 1998 consisted of projects 1, 2, 3, 11, 15, and 19. Many of the projects were inspired by student term papers from previous semesters, others by conversations with colleagues. I have used many projects more than once; others have been replaced or rotated as I've expanded my collection of possible projects.
16 With my prior approval, teams can modify or replace any of the projects; in practice, projects are seldom changed. The team members can discuss their project with me or the teaching assistant; we do not help collect data, but we will answer questions about project design and data analysis.
17 To keep the workload reasonable, I no longer assign a semester-long term paper, and I have reduced the number of homework exercises each week from ten to five. We cover the same material in the same or similar textbook (Smith 1991, then Smith 1998), as my intent is to use projects not to teach different topics, but to aid student learning.
18 We can often improve and test our understanding of a subject by writing about it. For instance, some of my best students rewrite their lecture notes as essays in their own words. The act of writing clarifies and reinforces their understanding, and the attempt to restate the lecture in their own words shows them which parts they don't really understand and need to figure out.
19 In a statistics class, written reports of project results can be used as nontrivial writing assignments that help students to learn statistics, improve their writing skills, and overcome the preconception that statistics is just plug-and-chug. In my class, the team as a whole is responsible for the project analysis and implementation, but individual team members take turns writing the project reports. Thus each project receives two grades, an analysis grade that is given to every team member and a writing grade that is given to the report's author. During the course of the semester, each student writes two project reports. From the professor's viewpoint, two brief project reports are less burdensome to grade than a semester-long term paper.
20 I grade each report not only on the analysis, but also on whether the writing is clear, persuasive, and grammatically correct. When I hand out the initial project assignments, I include a page of instructions for preparing the reports and a page of grammar reminders. The instructions include this advice:
The purpose of your report is to explain your project's objectives, how you obtained your data, the inferences you draw from your data, and any reservations you have about your conclusions. It should be fair, honest, and interesting -- the kind of report that you would find informative and enjoy reading.
Any relevant data that you use should be included as an appendix to your report; this appendix can be handwritten as long as it is clear, clean, and readable. Data that are used in the body of your report should be presented clearly and effectively in tables or graphs.
The body of your report must be typed and should use clear, concise, and persuasive prose. It should be long enough to make the points you are trying to make, but not so long that the reader becomes bored. Not counting appendixes, tables, and graphs, I will be surprised if your report has fewer than 3 or more than 5 pages.
You can use the first person and a conversational tone ("We settled in at the library and started flipping pages, looking for the right data and trying not to be distracted by articles about OJ Simpson."), but don't be sloppy or use excessive slang ("We finally found the libary and some mags, but were bummed by all all the OJ arcticles."). I will deduct points for typographical and grammatical errors. I will add points for writing that holds the reader's attention while still being serious, not silly.
21 The grammar reminders include these tips:
22 In class, I say forcefully that statistics can be used to inform, but we need to communicate that information effectively. Excess words and passive constructions are particularly deadly. I suggest that once students have what they consider a final draft, they read it one more time looking for unnecessary words that can be trimmed. I also strongly recommend that the team members help one another by reading each other's reports, as peer evaluations often help both the author and the reader (Stromberg and Ramanathan 1996).
23 I post a few especially well-written papers on the course web site. Students seem to appreciate these very concrete examples of what the professor values, and to take pride in having their papers selected as model reports.
24 Many students graduate from college having had no instruction or practice in public speaking, and, indeed, harboring a deep dread of having to speak to an audience. When graduates are asked, five or ten years later, what they wish they had learned in college, the ability to speak effectively and without fear is often near the top of the list.
25 I consequently now ask students to make brief oral presentations (five minutes maximum) of their project results. To encourage teamwork, the person who gives the oral report cannot be the person who authored the written report. These presentations can utilize handouts, an overhead projector, and other visual aids.
26 In a class of thirty, ten teams making biweekly oral reports take up about 20 to 30 minutes of classroom time a week. My hope is that these lost lecture minutes will be more than offset by the hours students spend outside class learning statistics by working on their projects.
27 The fear of the acute embarrassment that would result from an inaccurate, disorganized, or incoherent oral presentation provides a tremendous motivation to prepare adequately. In addition, attempts to express concepts in our own words can help us understand these ideas better and retain them longer. Writing assignments are one way to do this; speaking assignments are another. Oral reports can not only help students develop the ability to speak coherently and persuasively, but can also help them learn statistics.
28 I try to give students a few tips on how to be effective speakers. I tell them that we are all prone to nervous habits (fiddling with a button, putting a hand in a pocket, saying "um") that distract listeners and signal the speaker's nervousness. Speakers are usually unaware of these habits, and one of our jobs as a supportive classroom audience is to alert them to these problems. I also tell students that audiences have more confidence in speakers who don't rely much on notes: someone who reads a speech may be just reciting what someone else wrote. A memorized speech can have the same effect. The goal is to give an extemporaneous speech that tells the audience that the speaker knows the material and is expressing it in his or her own words. I also advise students to have lots of eye contact with individual members of the audience, instead of looking at notes, the floor, or the back of the room.
29 Just as the development of good writing skills requires useful feedback, so does the development of good speaking skills. At the conclusion of each presentation, I give the class a few moments to write down constructive suggestions, which I collect and give to the speaker at the end of class. If everyone says "slow down" or "speak up," the speaker will know this is a serious problem. This exercise also encourages everyone to think about what works and what doesn't. I make written suggestions too and grade each oral presentation. One enlightening practice is to write especially popular phrases (such as "um" and "basically") at the top of the page and tabulate how many times the student uses them.
30 The use of a sequence of projects gives students an opportunity to improve their speaking abilities after receiving this feedback. Obviously, people cannot become effective speakers by giving two oral presentations, any more than they could become effective authors by writing two papers. Instead, these oral presentations should be viewed as opportunities to nurture and develop skills that will be honed over a lifetime.
31 Because thinking on one's feet is an important objective, I tell students to ask each speaker challenging questions. If the questions lag, I fire away. Even "dumb" questions can be useful, as they force the speaker to explain things differently and perhaps more clearly. Although they are not rewarded directly for asking questions, students find spirited exchanges between the speaker and the audience to be not only beneficial, but a great deal of fun. I have also noticed that students tend to ask tougher questions of the more accomplished or arrogant speakers and to take it easy on those who are struggling.
32 Writing and speaking assignments are more difficult to grade than traditional computational questions, particularly since the latter can often be handled by multiple-choice tests. However, a student's success in meeting a course's objectives should be measured by authentic assessment techniques (Chance 1997; Garfield 1993). If we want students to understand and communicate statistical results, then their course grade should depend substantially on how well they do so. Cobb (1993) writes that, "Once we accept that assessment must be authentic, the most radical implication of TQM is that the entire course should be built of assessment tasks."
33 I do not go this far, but 40% of the course grade is now based on the team projects, 15% on homework exercises, 15% on the midterm, and 30% on the final examination. This compromise is intended to communicate to students the importance of the team projects, while also signaling that the projects are a means to an end -- that I expect them to learn the traditional material and will test them on that material in a traditional way.
34 For the team projects, each student accumulates 10 separate grades during the semester, each worth 4% of the course grade: six team grades on project design and analysis, two individual grades on written reports, and two individual grades on oral reports. Each of these is a letter grade (A, A-, B+, and so on), which is converted to a numerical equivalent when the overall course grade is calculated.
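To make this weighting concrete, here is a short Python sketch of the course-grade calculation. The 4-point letter-to-number scale and the example grades are assumptions for illustration; the text does not specify which numerical equivalents are actually used.

# Assumed conventional 4-point scale; the actual numerical equivalents
# used in the course are not specified in the text.
SCALE = {"A": 4.0, "A-": 3.7, "B+": 3.3, "B": 3.0, "B-": 2.7,
         "C+": 2.3, "C": 2.0, "C-": 1.7, "D+": 1.3, "D": 1.0, "F": 0.0}

def course_grade(project_grades, homework, midterm, final):
    # Ten project grades (six team, two written, two oral), each worth 4%,
    # plus 15% for homework, 15% for the midterm, and 30% for the final.
    assert len(project_grades) == 10
    projects = sum(0.04 * SCALE[g] for g in project_grades)  # 40% in total
    return (projects
            + 0.15 * SCALE[homework]
            + 0.15 * SCALE[midterm]
            + 0.30 * SCALE[final])

# Hypothetical student: strong project work, a B+ midterm, an A- final.
print(round(course_grade(["A", "A-", "B+", "A", "A", "A-", "B+", "A", "A-", "A"],
                         homework="A-", midterm="B+", final="A-"), 2))  # prints 3.67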
35 The first time that I incorporated team projects into a statistics course, I gave the students an anonymous survey on the last class day that was collected by another student and given to me after the semester grades had been recorded. There were four questions on this survey. The first was as follows:
This semester, I am trying an experiment. I reduced the number of homework exercises each week from 10 to 5 and eliminated the semester term paper. Instead, the class has been divided into 3-person teams to work on 6 biweekly projects, with each person responsible for 2 written essays and 2 oral presentations. Please advise me about how this experiment worked and how it might be improved. Do not sign your name and please be honest and candid.
Overall, I feel that this new format is
a terrible idea / a bad idea / a good idea / a great idea
Of the 30 students taking the class, 6 said it was a good idea and 24 said it was a great idea.
36 The other questions were open-ended: "What I like most about this course is," "What I like least about this course is," and "Suggestions for improving this course." All of the answers were positive and gratifying. The projects were mentioned often in the answers to the second question. The third and fourth questions were either left blank or answered positively. Here are some of the comments:
37 Examination scores also improved dramatically. Classes are not random samples, but I did not announce the change in format ahead of time and there was no apparent reason for a systematic change in the students who enrolled the first time that I taught the course with projects. The tests were of similar difficulty and covered the same chapters in the same textbook. Comparisons are now more problematic because I have since changed textbooks (from Smith 1991 to Smith 1998) and students know of the new format when they register for the course.
38 The last semester that I taught the course without team projects, the scores on the midterm examination had a mean of 80.79 and a standard deviation of 16.00, and the scores on the final examination had a mean of 80.27 and a standard deviation of 12.56. These are very close to the average scores in this class for the past 10 years.
39 Taught with projects, the scores on the midterm had a mean of 92.13 and a standard deviation of 6.96. This is the highest average score that I have ever had in a statistics class. I tried to make the final examination tougher in order to decompress the scores, but was only partly successful. The mean was 88.12 and the standard deviation 8.28.
40 Figure 1 shows boxplots of the test scores. Particularly noteworthy is the less-frequent occurrence of very low scores. In addition, no one dropped the project-based course; the previous semester, two students (a typical number) dropped the course after the midterm.
Figure 1. Boxplots for Two Adjacent Semesters.
41 Heartened by these test scores and the student enthusiasm, I am persuaded that a sequence of team projects with written reports and oral presentations is a promising approach that may help students learn statistics and also write and speak more effectively. Students appear to be aware of these benefits and to appreciate the opportunity to develop these skills.
Bradstreet, T. E. (1996), "Teaching Introductory Statistics Courses So That Nonstatisticians Experience Statistical Reasoning," The American Statistician, 50, 69-78.
Chance, B. L. (1997), "Experiences with Authentic Assessment Techniques in an Introductory Statistics Course," Journal of Statistics Education, [Online], 5(3). (http://www.amstat.org/publications/jse/v5n3/chance.html)
Cobb, G. W. (1991), "Teaching Statistics: More Data, Less Lecturing," Amstat News, December, No. 182, 1 and 4.
----- (1993), "Reconsidering Statistics Education: A National Science Foundation Conference," Journal of Statistics Education, [Online], 1(1). (http://www.amstat.org/publications/jse/v1n1/cobb.html)
Dietz, E. J. (1993), "A Cooperative Learning Activity on Methods of Selecting a Sample," The American Statistician, 47, 104-108.
Fillebrown, S. (1994), "Using Projects in an Elementary Statistics Course for Non-Science Majors," Journal of Statistics Education, [Online], 2(2). (http://www.amstat.org/publications/jse/v2n2/fillebrown.html)
Garfield, J. (1993), "Teaching Statistics Using Small-Group Cooperative Learning," Journal of Statistics Education, [Online], 1(1). (http://www.amstat.org/publications/jse/v1n1/garfield.html)
Gnanadesikan, M., Scheaffer, R. L., Watkins, A. E., and Witmer, J. A. (1997), "An Activity-Based Statistics Course," Journal of Statistics Education, [Online], 5(2). (http://www.amstat.org/publications/jse/v5n2/gnanadesikan.html)
Hogg, R. V. (1991), "Statistical Education: Improvements Are Badly Needed," The American Statistician, 45, 342-343.
Hunter, W. G. (1977), "Some Ideas About Teaching Design of Experiments, with 2^5 Examples of Experiments Conducted by Students," The American Statistician, 31, 12-17.
Jones, L. (1991), "Using Cooperative Learning to Teach Statistics," Research Report Number 91-2, The L. L. Thurstone Psychometric Laboratory, University of North Carolina.
Keeler, C. M., and Steinhorst, R. K. (1995), "Using Small Groups to Promote Active Learning in the Introductory Statistics Course: A Report from the Field," Journal of Statistics Education, [Online], 3(2). (http://www.amstat.org/publications/jse/v3n2/keeler.html)
Ledolter, J. (1995), "Projects in Introductory Statistics Courses," The American Statistician, 49, 364-367.
Mackisack, M. (1994), "What Is the Use of Experiments Conducted By Statistics Students?," Journal of Statistics Education, [Online], 2(1). (http://www.amstat.org/publications/jse/v2n1/mackisack.html)
Shaughnessy, J. M. (1977), "Misconceptions of Probability: An Experiment With a Small-Group Activity-Based Model Building Approach to Introductory Probability at the College Level," Educational Studies in Mathematics, 8, 285-315.
Smith, G. (1991), Statistical Reasoning (3rd ed.), Boston: Allyn and Bacon.
----- (1998), Introduction to Statistical Reasoning, Boston: WCB/McGraw-Hill.
Snee, R. D. (1993), "What's Missing in Statistical Education?," The American Statistician, 47, 149-154.
Stromberg, A. J., and Ramanathan, S. (1996), "Easy Implementation of Writing in Introductory Statistics Courses," The American Statistician, 50, 159-163.
Gary Smith
Pomona College
Claremont, CA 91711